Convergence Analysis of Restarted Krylov Subspace Eigensolvers

Authors

  • Klaus Neymeyr
  • Ming Zhou
Abstract

The A-gradient minimization of the Rayleigh quotient allows one to construct robust and fast-convergent eigensolvers for the generalized eigenvalue problem for (A,M) with symmetric and positive definite matrices. The A-gradient steepest descent iteration is the simplest case of more general restarted Krylov subspace iterations, namely the special case in which all step-wise generated Krylov subspaces are two-dimensional. This paper contains a convergence analysis of restarted Krylov subspace iterations for the minimization of the Rayleigh quotient with Krylov subspaces of arbitrary dimensions. The eigenpair approximations, namely the Ritz vector and the Ritz value, are extracted in each step of the iteration by the Rayleigh-Ritz procedure. The new convergence analysis provides a sharp Ritz vector estimate together with a Ritz value estimate. These results improve the classical estimates by Kaniel (1966), Saad (1980), and Parlett (1980), and generalize a result of Knyazev (1987).


Similar Articles

Sharp Ritz Value Estimates for Restarted Krylov Subspace Iterations

Gradient iterations for the Rayleigh quotient are elemental methods for computing the smallest eigenvalues of a pair of symmetric and positive definite matrices. A considerable convergence acceleration can be achieved by preconditioning and by computing Rayleigh-Ritz approximations from subspaces of increasing dimensions. An example of the resulting Krylov subspace eigensolvers is the generaliz...


A Restarted Krylov Subspace Method for the Evaluation of Matrix Functions

We show how the Arnoldi algorithm for approximating a function of a matrix times a vector can be restarted in a manner analogous to restarted Krylov subspace methods for solving linear systems of equations. The resulting restarted algorithm reduces to other known algorithms for the reciprocal and the exponential functions. We further show that the restarted algorithm inherits the superlinear co...
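The core building block of that restarted method, a single Arnoldi cycle approximating f(A)b as beta * V_m f(H_m) e_1, can be sketched as below for f = exp. A full restarted scheme would repeat such cycles on an error representation; this single-cycle sketch with an assumed function name is only an illustration of the basic approximation.

```python
import numpy as np
from scipy.linalg import expm

def arnoldi_expm_vec(A, b, m=20):
    """Approximate exp(A) @ b with one Arnoldi cycle of dimension m:
    build an orthonormal Krylov basis V and upper Hessenberg H = V'AV,
    then return ||b|| * V @ expm(H) @ e1. Illustrative sketch only."""
    n = len(b)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        # Orthogonalize against the current basis (modified Gram-Schmidt)
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:  # happy breakdown: invariant subspace found
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    Hm = H[:m, :m]
    # expm(Hm)[:, 0] is expm(Hm) applied to the first unit vector e1
    return beta * V[:, :m] @ expm(Hm)[:, 0]
```

For the reciprocal function the same projection reduces to the FOM iterate for the linear system, which is the analogy to restarted linear solvers that the paper above exploits.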


Some new restart vectors for explicitly restarted Arnoldi method

The explicitly restarted Arnoldi method (ERAM) can be used to find some eigenvalues of large and sparse matrices. However, it has been shown that even this method may fail to converge. In this paper, we present two new methods to accelerate the convergence of ERAM algorithm. In these methods, we apply two strategies for the updated initial vector in each restart cycles. The implementation of th...


On preconditioned eigensolvers and Invert-Lanczos processes

This paper deals with the convergence analysis of various preconditioned iterations to compute the smallest eigenvalue of a discretized self-adjoint and elliptic partial differential operator. For these eigenproblems several preconditioned iterative solvers are known, but unfortunately, the convergence theory for some of these solvers is not very well understood. The aim is to show that precond...


Toward Restarting Strategies Tuning for a Krylov Eigenvalue Solver

Krylov eigensolvers are used in many scientific fields, such as nuclear physics, page ranking, and oil and gas exploration. In this paper, we focus on the ERAM Krylov eigensolver, whose convergence is strongly correlated to the Krylov subspace size and the restarting vector v0, a unit norm vector. We focus on computing the restarting vector v0 to accelerate the ERAM convergence. First, we stu...



Journal:
  • SIAM J. Matrix Analysis Applications

Volume 37, Issue 

Pages  -

Publication date: 2016